2024-03-19 18:20:18,446 [ 345587 ] INFO : ClickHouse root is not set. Will use /home/ubuntu/_work/_temp/test/git-repo-copy (runner:41, check_args_and_update_paths)
2024-03-19 18:20:18,446 [ 345587 ] INFO : Cases dir is not set. Will use /home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration (runner:89, check_args_and_update_paths)
2024-03-19 18:20:18,446 [ 345587 ] INFO : src dir is not set. Will use /home/ubuntu/_work/_temp/test/git-repo-copy/src (runner:96, check_args_and_update_paths)
2024-03-19 18:20:18,446 [ 345587 ] INFO : base_configs_dir: /home/ubuntu/_work/_temp/test/git-repo-copy/programs/server, binary: /home/ubuntu/_work/_temp/test/build/clickhouse, cases_dir: /home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration (runner:98, check_args_and_update_paths)
clickhouse_integration_tests_volume
WARNING: Ignoring custom format, because both --format and --quiet are set.
Running pytest container as: 'docker run --rm --name clickhouse_integration_tests_d2wr8n --privileged --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-odbc-bridge:/clickhouse-odbc-bridge --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-library-bridge:/clickhouse-library-bridge --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/src/Server/grpc_protos:/ClickHouse/src/Server/grpc_protos --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e XTABLES_LOCKFILE=/run/host/xtables.lock -e PYTHONUNBUFFERED=1 -e DOCKER_DOTNET_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_HELPER_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_BASE_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBERIZED_HADOOP_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBEROS_KDC_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JS_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_PHP_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e PYTEST_OPTS='--dist=loadfile -n 10 -rfEps --run-id=2 --color=no --durations=0 test_tlsv1_3/test.py::test_create_user test_tlsv1_3/test.py::test_https test_tlsv1_3/test.py::test_https_non_ssl_auth test_tlsv1_3/test.py::test_https_wrong_cert -vvv' altinityinfra/integration-tests-runner:0-0a8ac3b092733da37e3e2a0079c486938a36790d'.
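For context: the CalledProcessError at the very end of this log shows that the runner executes this command via subprocess.check_call(cmd, shell=True), so a failing pytest run inside the container surfaces as an exception in the runner. A minimal sketch of that pattern, with an abbreviated command string and hypothetical variable names (not the runner's actual code):

    import subprocess

    # Hypothetical reconstruction of the runner's final step, based on the
    # "subprocess.check_call(cmd, shell=True)" frame in the CalledProcessError
    # at the bottom of this log. The real command string is the full
    # "docker run ..." line printed above.
    cmd = (
        "docker run --rm --name clickhouse_integration_tests_d2wr8n --privileged "
        "-e PYTEST_OPTS='--dist=loadfile -n 10 -rfEps --run-id=2 --color=no "
        "--durations=0 test_tlsv1_3/test.py::test_create_user -vvv' "
        "altinityinfra/integration-tests-runner:0-0a8ac3b092733da37e3e2a0079c486938a36790d"
    )

    # pytest exits non-zero when any test fails, so a failing suite turns into
    # subprocess.CalledProcessError here, which is exactly how this log ends.
    subprocess.check_call(cmd, shell=True)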
Start tests
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-8.0.2, pluggy-1.4.0 -- /usr/bin/python3
cachedir: .pytest_cache
rootdir: /ClickHouse/tests/integration
configfile: pytest.ini
plugins: repeat-0.9.3, xdist-3.5.0, random-0.2, timeout-2.2.0, order-1.0.0
timeout: 900.0s
timeout method: signal
timeout func_only: False
created: 10/10 workers
10 workers [4 items]

scheduling tests via LoadFileScheduling

test_tlsv1_3/test.py::test_create_user
[gw0] [ 25%] FAILED test_tlsv1_3/test.py::test_create_user
test_tlsv1_3/test.py::test_https
[gw0] [ 50%] FAILED test_tlsv1_3/test.py::test_https
test_tlsv1_3/test.py::test_https_non_ssl_auth
[gw0] [ 75%] FAILED test_tlsv1_3/test.py::test_https_non_ssl_auth
test_tlsv1_3/test.py::test_https_wrong_cert
[gw0] [100%] FAILED test_tlsv1_3/test.py::test_https_wrong_cert

=================================== FAILURES ===================================
_______________________________ test_create_user _______________________________
[gw0] linux -- Python 3.8.10 /usr/bin/python3

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'check_hostname': None, 'context': <ssl.SSLContext object at 0x...>}
host = '172.16.1.2:8443'
h = <http.client.HTTPSConnection object at 0x...>

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.8/urllib/request.py:1354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/http/client.py:1256: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.8/http/client.py:1302: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.8/http/client.py:1251: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.8/http/client.py:1011: in _send_output
    self.send(msg)
/usr/lib/python3.8/http/client.py:951: in send
    self.connect()
/usr/lib/python3.8/http/client.py:1425: in connect
    self.sock = self._context.wrap_socket(self.sock,
helpers/ssl_context.py:12: in wrap_socket
    return super().wrap_socket(sock, *args, **kwargs)
/usr/lib/python3.8/ssl.py:500: in wrap_socket
    return self.sslsocket_class._create(
/usr/lib/python3.8/ssl.py:1069: in _create
    self.do_handshake()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <ssl.SSLSocket ...>
block = False

    @_sslcopydoc
    def do_handshake(self, block=False):
        self._check_connected()
        timeout = self.gettimeout()
        try:
            if timeout == 0.0 and block:
                self.settimeout(None)
>           self._sslobj.do_handshake()
E           ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:1131)

/usr/lib/python3.8/ssl.py:1338: SSLEOFError

During handling of the above exception, another exception occurred:

    def test_create_user():
        instance.query("CREATE USER emma IDENTIFIED WITH ssl_certificate CN 'client3'")
>       assert (
            execute_query_https("SELECT currentUser()", user="emma", cert_name="client3")
            == "emma\n"
        )

test_tlsv1_3/test.py:206:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_tlsv1_3/test.py:64: in execute_query_https
    response = urllib.request.urlopen(
/usr/lib/python3.8/urllib/request.py:222: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.8/urllib/request.py:525: in open
    response = self._open(req, data)
/usr/lib/python3.8/urllib/request.py:542: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.8/urllib/request.py:502: in _call_chain
    result = func(*args)
/usr/lib/python3.8/urllib/request.py:1397: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'check_hostname': None, 'context': <ssl.SSLContext object at 0x...>}
host = '172.16.1.2:8443'
h = <http.client.HTTPSConnection object at 0x...>

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:1131)>

/usr/lib/python3.8/urllib/request.py:1357: URLError
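The frames above show the shape of the failing helper: test.py:64 calls urllib.request.urlopen, and the TLS context comes from a wrap_socket override in helpers/ssl_context.py. A plausible sketch of what execute_query_https does, inferred from the traceback only; certificate paths, header names, and the URL layout are assumptions, not the test's actual code:

    import ssl
    import urllib.parse
    import urllib.request

    # Hedged reconstruction of test_tlsv1_3/test.py::execute_query_https.
    # Host and port come from the log (172.16.1.2:8443); everything else
    # that does not appear in the traceback is hypothetical.
    def execute_query_https(query, user, cert_name=None, enable_ssl_auth=True, password=None):
        url = f"https://172.16.1.2:8443/?query={urllib.parse.quote(query)}"
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.minimum_version = ssl.TLSVersion.TLSv1_3  # the suite targets TLSv1.3
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        if enable_ssl_auth and cert_name:
            # client1/client3-style certificates are what the server's
            # ssl_certificate authentication matches against the CN.
            ctx.load_cert_chain(f"certs/{cert_name}-cert.pem", f"certs/{cert_name}-key.pem")
        request = urllib.request.Request(url)
        request.add_header("X-ClickHouse-User", user)
        if password:
            request.add_header("X-ClickHouse-Key", password)
        # If the server aborts the handshake, this raises URLError wrapping
        # SSLEOFError, exactly as in the traceback above.
        response = urllib.request.urlopen(request, context=ctx)
        return response.read().decode("utf-8")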
---------------------------- Captured stdout setup -----------------------------
Copy common default production configuration from /clickhouse-config. Files: config.xml, users.xml
------------------------------ Captured log setup ------------------------------
2024-03-19 18:20:23 [ 387 ] DEBUG : Command:['docker ps | wc -l'] (cluster.py:97, run_and_check)
2024-03-19 18:20:23 [ 387 ] DEBUG : Stdout:1 (cluster.py:105, run_and_check)
2024-03-19 18:20:23 [ 387 ] DEBUG : No running containers (conftest.py:44, cleanup_environment)
2024-03-19 18:20:23 [ 387 ] INFO : Running tests in /ClickHouse/tests/integration/test_tlsv1_3/test.py (cluster.py:2508, start)
2024-03-19 18:20:23 [ 387 ] DEBUG : Cluster start called. is_up=False (cluster.py:2515, start)
2024-03-19 18:20:23 [ 387 ] DEBUG : Docker networks for project roottesttlsv13 are NETWORK ID NAME DRIVER SCOPE (cluster.py:633, print_all_docker_pieces)
2024-03-19 18:20:23 [ 387 ] DEBUG : Docker containers for project roottesttlsv13 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:641, print_all_docker_pieces)
2024-03-19 18:20:23 [ 387 ] DEBUG : Docker volumes for project roottesttlsv13 are DRIVER VOLUME NAME (cluster.py:649, print_all_docker_pieces)
2024-03-19 18:20:23 [ 387 ] DEBUG : Cleanup called (cluster.py:654, cleanup)
2024-03-19 18:20:23 [ 387 ] DEBUG : Docker networks for project roottesttlsv13 are NETWORK ID NAME DRIVER SCOPE (cluster.py:633, print_all_docker_pieces)
2024-03-19 18:20:23 [ 387 ] DEBUG : Docker containers for project roottesttlsv13 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:641, print_all_docker_pieces)
2024-03-19 18:20:23 [ 387 ] DEBUG : Docker volumes for project roottesttlsv13 are DRIVER VOLUME NAME (cluster.py:649, print_all_docker_pieces)
2024-03-19 18:20:23 [ 387 ] DEBUG : Command:docker container list --all --filter name='^/roottesttlsv13_.*_1$' --format '{{.ID}}:{{.Names}}' (cluster.py:97, run_and_check)
2024-03-19 18:20:23 [ 387 ] DEBUG : Unstopped containers: {} (cluster.py:668, cleanup)
2024-03-19 18:20:23 [ 387 ] DEBUG : No running containers for project: roottesttlsv13 (cluster.py:682, cleanup)
2024-03-19 18:20:23 [ 387 ] DEBUG : Trying to prune unused networks... (cluster.py:688, cleanup)
2024-03-19 18:20:23 [ 387 ] DEBUG : Trying to prune unused images... (cluster.py:704, cleanup)
2024-03-19 18:20:23 [ 387 ] DEBUG : Command:['docker', 'image', 'prune', '-f'] (cluster.py:97, run_and_check)
2024-03-19 18:20:23 [ 387 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:105, run_and_check)
2024-03-19 18:20:23 [ 387 ] DEBUG : Images pruned (cluster.py:707, cleanup)
2024-03-19 18:20:23 [ 387 ] DEBUG : Trying to prune unused volumes... (cluster.py:713, cleanup)
2024-03-19 18:20:23 [ 387 ] DEBUG : Command:['docker volume ls | wc -l'] (cluster.py:97, run_and_check)
2024-03-19 18:20:23 [ 387 ] DEBUG : Stdout:1 (cluster.py:105, run_and_check)
2024-03-19 18:20:23 [ 387 ] DEBUG : Setup directory for instance: node (cluster.py:2528, start)
2024-03-19 18:20:23 [ 387 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4146, create_dir)
2024-03-19 18:20:23 [ 387 ] DEBUG : Create directory for common tests configuration (cluster.py:4151, create_dir)
2024-03-19 18:20:23 [ 387 ] DEBUG : Copy common configuration from helpers (cluster.py:4171, create_dir)
2024-03-19 18:20:23 [ 387 ] DEBUG : Generate and write macros file (cluster.py:4184, create_dir)
2024-03-19 18:20:23 [ 387 ] DEBUG : Copy custom test config files ['/ClickHouse/tests/integration/test_tlsv1_3/configs/ssl_config.xml', '/ClickHouse/tests/integration/test_tlsv1_3/certs/server-key.pem', '/ClickHouse/tests/integration/test_tlsv1_3/certs/server-cert.pem', '/ClickHouse/tests/integration/test_tlsv1_3/certs/ca-cert.pem', '/ClickHouse/tests/integration/test_tlsv1_3/certs/dhparam4096.pem'] to /ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/configs/config.d (cluster.py:4215, create_dir)
2024-03-19 18:20:23 [ 387 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/database (cluster.py:4232, create_dir)
2024-03-19 18:20:23 [ 387 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/logs (cluster.py:4243, create_dir)
2024-03-19 18:20:23 [ 387 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log"] (cluster.py:4319, create_dir)
2024-03-19 18:20:23 [ 387 ] DEBUG : Env {'TSAN_OPTIONS': 'second_deadlock_stack=1', 'ASAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw'} stored in /ClickHouse/tests/integration/test_tlsv1_3/_instances_2/.env (cluster.py:70, _create_env_file)
2024-03-19 18:20:23 [ 387 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file)
2024-03-19 18:20:23 [ 387 ] DEBUG : No config file found (config.py:28, find_config_file)
2024-03-19 18:20:23 [ 387 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file)
2024-03-19 18:20:23 [ 387 ] DEBUG : No config file found (config.py:28, find_config_file)
2024-03-19 18:20:23 [ 387 ] DEBUG : http://localhost:None "GET /version HTTP/1.1" 200 824 (connectionpool.py:429, _make_request)
2024-03-19 18:20:23 [ 387 ] DEBUG : Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_tlsv1_3/_instances_2/.env', '--project-name', 'roottesttlsv13', '--file', '/ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/docker-compose.yml', 'pull'] (cluster.py:97, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : Stderr:Pulling node ... (cluster.py:107, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : Stderr:Pulling node ... pulling from altinityinfra/integr... (cluster.py:107, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : Stderr:Pulling node ... digest: sha256:5a6b09e905506d4aea... (cluster.py:107, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : Stderr:Pulling node ... status: image is up to date for a... (cluster.py:107, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : Stderr:Pulling node ... done (cluster.py:107, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : ('Trying to create ClickHouse instance by command %s', 'docker-compose --env-file /ClickHouse/tests/integration/test_tlsv1_3/_instances_2/.env --project-name roottesttlsv13 --file /ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/docker-compose.yml up -d --no-recreate') (cluster.py:2852, start)
2024-03-19 18:20:35 [ 387 ] DEBUG : Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_tlsv1_3/_instances_2/.env', '--project-name', 'roottesttlsv13', '--file', '/ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/docker-compose.yml', 'up', '-d', '--no-recreate'] (cluster.py:97, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : Stderr:Creating network "roottesttlsv13_default" with the default driver (cluster.py:107, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : Stderr:Creating roottesttlsv13_node_1 ... (cluster.py:107, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : Stderr:Creating roottesttlsv13_node_1 ... done (cluster.py:107, run_and_check)
2024-03-19 18:20:35 [ 387 ] DEBUG : ClickHouse instance created (cluster.py:2860, start)
2024-03-19 18:20:35 [ 387 ] DEBUG : get_instance_ip instance_name=node (cluster.py:1851, get_instance_ip)
2024-03-19 18:20:35 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/roottesttlsv13_node_1/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:35 [ 387 ] DEBUG : Waiting for ClickHouse start in node, ip: 172.16.1.2... (cluster.py:2867, start)
2024-03-19 18:20:35 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/roottesttlsv13_node_1/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:35 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:35 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:36 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:36 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:36 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:36 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:36 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:36 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:36 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:36 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:36 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:37 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:38 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:38 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:38 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:38 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:38 [ 387 ] DEBUG : http://localhost:None "GET /v1.44/containers/5830cc29ba75425fbc135d3f535051c0e98c89275dcc9d3594f1a3bc6fe1720d/json HTTP/1.1" 200 None (connectionpool.py:429, _make_request)
2024-03-19 18:20:38 [ 387 ] DEBUG : ClickHouse node started (cluster.py:2871, start)
------------------------------ Captured log call -------------------------------
2024-03-19 18:20:38 [ 387 ] DEBUG : Executing query CREATE USER emma IDENTIFIED WITH ssl_certificate CN 'client3' on node (cluster.py:3300, query)
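All four failures below share one root symptom: the server closes the connection during the TLS handshake, so the client gets ssl.SSLEOFError before any HTTP status can be exchanged. A quick way to tell "server refuses this TLS version" apart from "server accepts it" is to probe the endpoint one protocol version at a time. Only the host and port below come from this log; the rest is hypothetical diagnostic code, not part of the suite:

    import socket
    import ssl

    # Minimal handshake probe. An immediate EOF mid-handshake, as in the
    # SSLEOFError tracebacks in this log, means the server hung up before
    # negotiation finished (e.g. no acceptable protocol version or an
    # unacceptable client certificate situation).
    def probe(version, host="172.16.1.2", port=8443):
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        ctx.minimum_version = version
        ctx.maximum_version = version
        try:
            with socket.create_connection((host, port), timeout=5) as sock:
                with ctx.wrap_socket(sock) as tls:
                    return tls.version()  # e.g. 'TLSv1.3'
        except OSError as err:  # ssl.SSLError is a subclass of OSError
            return f"handshake failed: {err!r}"

    print(probe(ssl.TLSVersion.TLSv1_2))
    print(probe(ssl.TLSVersion.TLSv1_3))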
__________________________________ test_https __________________________________
[gw0] linux -- Python 3.8.10 /usr/bin/python3

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'check_hostname': None, 'context': <ssl.SSLContext object at 0x...>}
host = '172.16.1.2:8443'
h = <http.client.HTTPSConnection object at 0x...>

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.8/urllib/request.py:1354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/http/client.py:1256: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.8/http/client.py:1302: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.8/http/client.py:1251: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.8/http/client.py:1011: in _send_output
    self.send(msg)
/usr/lib/python3.8/http/client.py:951: in send
    self.connect()
/usr/lib/python3.8/http/client.py:1425: in connect
    self.sock = self._context.wrap_socket(self.sock,
helpers/ssl_context.py:12: in wrap_socket
    return super().wrap_socket(sock, *args, **kwargs)
/usr/lib/python3.8/ssl.py:500: in wrap_socket
    return self.sslsocket_class._create(
/usr/lib/python3.8/ssl.py:1069: in _create
    self.do_handshake()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <ssl.SSLSocket ...>
block = False

    @_sslcopydoc
    def do_handshake(self, block=False):
        self._check_connected()
        timeout = self.gettimeout()
        try:
            if timeout == 0.0 and block:
                self.settimeout(None)
>           self._sslobj.do_handshake()
E           ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:1131)

/usr/lib/python3.8/ssl.py:1338: SSLEOFError

During handling of the above exception, another exception occurred:

    def test_https():
>       assert (
            execute_query_https("SELECT currentUser()", user="john", cert_name="client1")
            == "john\n"
        )

test_tlsv1_3/test.py:71:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_tlsv1_3/test.py:64: in execute_query_https
    response = urllib.request.urlopen(
/usr/lib/python3.8/urllib/request.py:222: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.8/urllib/request.py:525: in open
    response = self._open(req, data)
/usr/lib/python3.8/urllib/request.py:542: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.8/urllib/request.py:502: in _call_chain
    result = func(*args)
/usr/lib/python3.8/urllib/request.py:1397: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'check_hostname': None, 'context': <ssl.SSLContext object at 0x...>}
host = '172.16.1.2:8443'
h = <http.client.HTTPSConnection object at 0x...>

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:1131)>

/usr/lib/python3.8/urllib/request.py:1357: URLError
___________________________ test_https_non_ssl_auth ____________________________
[gw0] linux -- Python 3.8.10 /usr/bin/python3

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'check_hostname': None, 'context': <ssl.SSLContext object at 0x...>}
host = '172.16.1.2:8443'
h = <http.client.HTTPSConnection object at 0x...>

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.8/urllib/request.py:1354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.8/http/client.py:1256: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.8/http/client.py:1302: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.8/http/client.py:1251: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.8/http/client.py:1011: in _send_output
    self.send(msg)
/usr/lib/python3.8/http/client.py:951: in send
    self.connect()
/usr/lib/python3.8/http/client.py:1425: in connect
    self.sock = self._context.wrap_socket(self.sock,
helpers/ssl_context.py:12: in wrap_socket
    return super().wrap_socket(sock, *args, **kwargs)
/usr/lib/python3.8/ssl.py:500: in wrap_socket
    return self.sslsocket_class._create(
/usr/lib/python3.8/ssl.py:1069: in _create
    self.do_handshake()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <ssl.SSLSocket ...>
block = False

    @_sslcopydoc
    def do_handshake(self, block=False):
        self._check_connected()
        timeout = self.gettimeout()
        try:
            if timeout == 0.0 and block:
                self.settimeout(None)
>           self._sslobj.do_handshake()
E           ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:1131)

/usr/lib/python3.8/ssl.py:1338: SSLEOFError

During handling of the above exception, another exception occurred:

    def test_https_non_ssl_auth():
        # Users with non-SSL authentication are allowed, in this case we can skip sending a client certificate at all (because "verificationMode" is set to "relaxed").
        # assert execute_query_https("SELECT currentUser()", user="peter", enable_ssl_auth=False) == "peter\n"
>       assert (
            execute_query_https(
                "SELECT currentUser()",
                user="jane",
                enable_ssl_auth=False,
                password="qwe123",
            )
            == "jane\n"
        )

test_tlsv1_3/test.py:114:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_tlsv1_3/test.py:64: in execute_query_https
    response = urllib.request.urlopen(
/usr/lib/python3.8/urllib/request.py:222: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.8/urllib/request.py:525: in open
    response = self._open(req, data)
/usr/lib/python3.8/urllib/request.py:542: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.8/urllib/request.py:502: in _call_chain
    result = func(*args)
/usr/lib/python3.8/urllib/request.py:1397: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'check_hostname': None, 'context': <ssl.SSLContext object at 0x...>}
host = '172.16.1.2:8443'
h = <http.client.HTTPSConnection object at 0x...>

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:1131)>

/usr/lib/python3.8/urllib/request.py:1357: URLError
____________________________ test_https_wrong_cert _____________________________
[gw0] linux -- Python 3.8.10 /usr/bin/python3

    def test_https_wrong_cert():
        # Wrong certificate: different user's certificate
        with pytest.raises(Exception) as err:
            execute_query_https("SELECT currentUser()", user="john", cert_name="client2")
>       assert "HTTP Error 403" in str(err.value)
E       AssertionError: assert 'HTTP Error 403' in '<urlopen error EOF occurred in violation of protocol (_ssl.c:1131)>'
E        +  where '<urlopen error EOF occurred in violation of protocol (_ssl.c:1131)>' = str(URLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1131)')))
E        +    where URLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1131)')) = <ExceptionInfo URLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1131)')) tblen=...>.value

test_tlsv1_3/test.py:89: AssertionError
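Note what this last assertion failure means: the test expects an HTTP 403, but the server aborts the TLS handshake, so there is no HTTP response at all and str(err.value) is an urlopen error rather than 'HTTP Error 403'. If one wanted the check to distinguish the two rejection layers, a sketch like the following would make the failure mode explicit (a hypothetical helper, not the test's actual code):

    import urllib.error

    # Hypothetical assertion helper separating "TLS handshake rejected the
    # connection" from "HTTP layer answered 403 Forbidden". This log shows
    # the former: URLError wrapping SSLEOFError, with no HTTP status.
    def assert_cert_rejected(exc):
        if isinstance(exc, urllib.error.HTTPError):
            # Handshake succeeded; ClickHouse rejected authentication.
            assert exc.code == 403
        else:
            # Handshake itself failed; the server hung up mid-negotiation.
            assert isinstance(exc, urllib.error.URLError)
            assert "EOF occurred in violation of protocol" in str(exc.reason)

(urllib.error.HTTPError subclasses URLError, so it must be checked first.)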
---------------------------- Captured log teardown -----------------------------
2024-03-19 18:20:39 [ 387 ] DEBUG : Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_tlsv1_3/_instances_2/.env', '--project-name', 'roottesttlsv13', '--file', '/ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/docker-compose.yml', 'stop', '--timeout', '20'] (cluster.py:97, run_and_check)
2024-03-19 18:20:41 [ 387 ] DEBUG : Stderr:Stopping roottesttlsv13_node_1 ... (cluster.py:107, run_and_check)
2024-03-19 18:20:41 [ 387 ] DEBUG : Stderr:Stopping roottesttlsv13_node_1 ... done (cluster.py:107, run_and_check)
2024-03-19 18:20:41 [ 387 ] DEBUG : Command:['bash', '-c', '[ -f /ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/logs/stderr.log* || true'] (cluster.py:97, run_and_check)
2024-03-19 18:20:41 [ 387 ] DEBUG : Command:['docker-compose', '--env-file', '/ClickHouse/tests/integration/test_tlsv1_3/_instances_2/.env', '--project-name', 'roottesttlsv13', '--file', '/ClickHouse/tests/integration/test_tlsv1_3/_instances_2/node/docker-compose.yml', 'down', '--volumes'] (cluster.py:97, run_and_check)
2024-03-19 18:20:42 [ 387 ] DEBUG : Stderr:Removing roottesttlsv13_node_1 ... (cluster.py:107, run_and_check)
2024-03-19 18:20:42 [ 387 ] DEBUG : Stderr:Removing roottesttlsv13_node_1 ... done (cluster.py:107, run_and_check)
2024-03-19 18:20:42 [ 387 ] DEBUG : Stderr:Removing network roottesttlsv13_default (cluster.py:107, run_and_check)
2024-03-19 18:20:42 [ 387 ] DEBUG : Cleanup called (cluster.py:654, cleanup)
2024-03-19 18:20:42 [ 387 ] DEBUG : Docker networks for project roottesttlsv13 are NETWORK ID NAME DRIVER SCOPE (cluster.py:633, print_all_docker_pieces)
2024-03-19 18:20:42 [ 387 ] DEBUG : Docker containers for project roottesttlsv13 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:641, print_all_docker_pieces)
2024-03-19 18:20:42 [ 387 ] DEBUG : Docker volumes for project roottesttlsv13 are DRIVER VOLUME NAME (cluster.py:649, print_all_docker_pieces)
2024-03-19 18:20:42 [ 387 ] DEBUG : Command:docker container list --all --filter name='^/roottesttlsv13_.*_1$' --format '{{.ID}}:{{.Names}}' (cluster.py:97, run_and_check)
2024-03-19 18:20:42 [ 387 ] DEBUG : Unstopped containers: {} (cluster.py:668, cleanup)
2024-03-19 18:20:42 [ 387 ] DEBUG : No running containers for project: roottesttlsv13 (cluster.py:682, cleanup)
2024-03-19 18:20:42 [ 387 ] DEBUG : Trying to prune unused networks... (cluster.py:688, cleanup)
2024-03-19 18:20:42 [ 387 ] DEBUG : Trying to prune unused images... (cluster.py:704, cleanup)
2024-03-19 18:20:42 [ 387 ] DEBUG : Command:['docker', 'image', 'prune', '-f'] (cluster.py:97, run_and_check)
2024-03-19 18:20:42 [ 387 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:105, run_and_check)
2024-03-19 18:20:42 [ 387 ] DEBUG : Images pruned (cluster.py:707, cleanup)
2024-03-19 18:20:42 [ 387 ] DEBUG : Trying to prune unused volumes... (cluster.py:713, cleanup)
2024-03-19 18:20:42 [ 387 ] DEBUG : Command:['docker volume ls | wc -l'] (cluster.py:97, run_and_check)
2024-03-19 18:20:42 [ 387 ] DEBUG : Stdout:1 (cluster.py:105, run_and_check)
============================== slowest durations ===============================
15.08s setup    test_tlsv1_3/test.py::test_create_user
3.10s teardown  test_tlsv1_3/test.py::test_https_wrong_cert
0.17s call      test_tlsv1_3/test.py::test_create_user
0.00s call      test_tlsv1_3/test.py::test_https_wrong_cert
0.00s call      test_tlsv1_3/test.py::test_https
0.00s call      test_tlsv1_3/test.py::test_https_non_ssl_auth
0.00s teardown  test_tlsv1_3/test.py::test_create_user
0.00s teardown  test_tlsv1_3/test.py::test_https
0.00s teardown  test_tlsv1_3/test.py::test_https_non_ssl_auth
0.00s setup     test_tlsv1_3/test.py::test_https
0.00s setup     test_tlsv1_3/test.py::test_https_wrong_cert
0.00s setup     test_tlsv1_3/test.py::test_https_non_ssl_auth
=========================== short test summary info ============================
FAILED test_tlsv1_3/test.py::test_create_user - urllib.error.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:1131)>
FAILED test_tlsv1_3/test.py::test_https - urllib.error.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:1131)>
FAILED test_tlsv1_3/test.py::test_https_non_ssl_auth - urllib.error.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:1131)>
FAILED test_tlsv1_3/test.py::test_https_wrong_cert - AssertionError: assert 'HTTP Error 403' in '<urlopen error EOF occurred in violation of protocol (_ssl.c:1131)>'
    subprocess.check_call(cmd, shell=True)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'docker run --rm --name clickhouse_integration_tests_d2wr8n --privileged --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-odbc-bridge:/clickhouse-odbc-bridge --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/_temp/test/build/clickhouse-library-bridge:/clickhouse-library-bridge --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/_temp/test/git-repo-copy/src/Server/grpc_protos:/ClickHouse/src/Server/grpc_protos --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e XTABLES_LOCKFILE=/run/host/xtables.lock -e PYTHONUNBUFFERED=1 -e DOCKER_DOTNET_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_HELPER_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_BASE_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBERIZED_HADOOP_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_KERBEROS_KDC_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_JS_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_MYSQL_PHP_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=0-0a8ac3b092733da37e3e2a0079c486938a36790d -e PYTEST_OPTS='--dist=loadfile -n 10 -rfEps --run-id=2 --color=no --durations=0 test_tlsv1_3/test.py::test_create_user test_tlsv1_3/test.py::test_https test_tlsv1_3/test.py::test_https_non_ssl_auth test_tlsv1_3/test.py::test_https_wrong_cert -vvv' altinityinfra/integration-tests-runner:0-0a8ac3b092733da37e3e2a0079c486938a36790d' returned non-zero exit status 1.